Mixing Implies Lower Bounds for Space Bounded Learning
Authors
Abstract
One can learn any hypothesis class H with O(log |H|) labeled examples. Alas, learning with so few examples requires saving the examples in memory, and this requires |X|^{O(log |H|)} memory states, where X is the set of all labeled examples. This motivates the question of how many labeled examples are needed when memory is bounded. Previous work showed, using techniques such as linear algebra and Fourier analysis, that parities cannot be learned with bounded memory and fewer than |H| examples. One might wonder whether a general combinatorial condition exists for unlearnability with bounded memory, as we have with the condition VCdim(H) = ∞ for PAC unlearnability. In this paper we give such a condition. We show that if a hypothesis class H, when viewed as a bipartite graph between hypotheses H and labeled examples X, is mixing, then learning it requires |H| examples under a certain bound on the memory. Note that the class of parities is mixing. Moreover, as an immediate corollary, we get that most hypothesis classes are unlearnable with bounded memory. Our proof technique is combinatorial in nature and very different from previous analyses.
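As a concrete illustration, here is a minimal sketch (not code from the paper) that builds the hypotheses graph for the parity class over {0,1}^3 and inspects the singular values of its biadjacency matrix, a standard spectral certificate of mixing; the variable names and the spectral test are assumptions of this sketch, not definitions taken from the paper.

import itertools
import numpy as np

n = 3
points = list(itertools.product([0, 1], repeat=n))

# Hypotheses: one parity h_a(x) = <a, x> mod 2 per vector a in {0,1}^n.
hypotheses = points
# Labeled examples: all pairs (x, y) with x in {0,1}^n and y in {0, 1}.
examples = [(x, y) for x in points for y in (0, 1)]

# Biadjacency matrix of the hypotheses graph:
# M[i, j] = 1 iff hypothesis i is consistent with labeled example j.
M = np.zeros((len(hypotheses), len(examples)))
for i, a in enumerate(hypotheses):
    for j, (x, y) in enumerate(examples):
        if sum(ai * xi for ai, xi in zip(a, x)) % 2 == y:
            M[i, j] = 1.0

# Each hypothesis labels every point exactly once, so the graph is
# left-regular with degree 2^n.
print("left degree:", int(M.sum(axis=1)[0]))          # 8 for n = 3

# For parities the spectrum below the top singular value is flat and
# small (here 6 followed by 2, 2, ...), which is what mixing looks
# like spectrally; a class with clustered, non-mixing structure would
# show a large second singular value instead.
print("singular values:", np.linalg.svd(M, compute_uv=False))

Swapping in a different hypothesis class only changes the consistency test that fills M; the rest of the sketch stays the same.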
Similar Resources
Mixing Implies Strong Lower Bounds for Space Bounded Learning
With any hypothesis class one can associate a bipartite graph whose vertices are the hypotheses H on one side and all possible labeled examples X on the other side, and a hypothesis is connected to all the labeled examples that are consistent with it. We call this graph the hypotheses graph. We prove that any hypothesis class whose hypotheses graph is mixing cannot be learned using fewer than 2...
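One standard way to make "mixing" precise, via the expander mixing lemma (the papers' exact parameters may differ), is spectral: if the hypotheses graph is d-left-regular and λ is the second singular value of its biadjacency matrix, then for all S ⊆ H and T ⊆ X,

  |e(S, T) − d·|S|·|T| / |X|| ≤ λ·√(|S|·|T|),

where e(S, T) counts the edges between S and T. Mixing means λ is small relative to d, so every large set of hypotheses is consistent with every large set of labeled examples roughly as often as in a random graph of the same degree.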
Entropy Samplers and Strong Generic Lower Bounds For Space Bounded Learning
With any hypothesis class one can associate a bipartite graph whose vertices are the hypotheses H on one side and all possible labeled examples X on the other side, and a hypothesis is connected to all the labeled examples that are consistent with it. We call this graph the hypotheses graph. We prove that any hypothesis class whose hypotheses graph is mixing cannot be learned using fewer than Ω(...
Algorithmic Stability and Ensemble-Based Learning. A dissertation submitted to the Faculty of the Division of the Physical Sciences in candidacy for the degree of Doctor of Philosophy, Department of Computer Science, The University of Chicago, by Samuel Kutin
We explore two themes in formal learning theory. We begin with a detailed, general study of the relationship between the generalization error and stability of learning algorithms. We then examine ensemble-based learning from the points of view of stability, decorrelation, and threshold complexity. A central problem of learning theory is bounding generalization error. Most such bounds have been ...
Stability Bounds for Stationary φ-mixing and β-mixing Processes
Most generalization bounds in learning theory are based on some measure of the complexity of the hypothesis class used, independently of any algorithm. In contrast, the notion of algorithmic stability can be used to derive tight generalization bounds that are tailored to specific learning algorithms by exploiting their particular properties. However, as in much of learning theory, existing stab...
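For comparison, the classical uniform-stability bound for i.i.d. samples due to Bousquet and Elisseeff, which this line of work extends to dependent (mixing) sequences, states that a β-uniformly stable algorithm trained on m examples with a loss bounded by M satisfies, with probability at least 1 − δ,

  R ≤ R̂ + 2β + (4mβ + M)·√(ln(1/δ) / (2m)),

where R is the true risk and R̂ the empirical risk. The φ-mixing and β-mixing versions replace the independence assumption with a quantitative decay of dependence between samples, at the price of extra terms in the bound.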
Journal: Electronic Colloquium on Computational Complexity (ECCC)
Volume: 24, Issue: -
Pages: -
Year of publication: 2017